The following report was written by University of Vermont environmental studies major and undergraduate senior Kate Bossert, who is interning with UKR-TAZ as a researcher on Ukraine-related environmental and technology issues.
The pros and cons of new technologies have been debated for as long as such technologies have existed. Information technologies are no exception, with artificial intelligence, most recently in the form of “generative AI,” the latest among them. In the context of Russia’s full-scale invasion of Ukraine, AI-based technologies have emerged as a new frontier for violence, psychological torture, misinformation, and the destruction of human life. This is taking place within what some have called “the twenty-first century’s ‘first cyber world war’.”
As social media have become a key source of news, the ease with which they spread misinformation – with the “false but interesting” story often trumping the “true but boring” one – has created “serious consequences for what counts as journalism and what gets conflated with the truth.” Distinguishing the real from the fake has become more challenging with AI, as users can manipulate audio clips, photos, and videos to create appearances that bear no relationship to reality. As public health experts have warned, “AI-fueled misinformation and disinformation serves to polarize society and create a harmful online environment.” AI-generated misinformation has been influencing elections (e.g., see here) and encouraging a rise in climate change denial, with users releasing content designed to outrage and polarize viewers while making money for its purveyors.
Russian cyber attackers operate on the same principles, with the additional goal of dampening Western governments’ support for Ukraine. Dina Temple-Raston’s Click Here podcast has covered this repeatedly (e.g., here and here), as have Microsoft Threat Intelligence reports on Russian threat actors and Russian influence and cyber operations, among other sources. Russian cyber actors have stolen data from Ukrainian firms that track crop yields, used techniques such as “password spraying” (trying a few common passwords across many accounts) to break into Ukrainian accounts, infiltrated private Ukrainian Signal chats, successfully attacked Ukrainian service providers, and obtained the locations of Ukrainian military units through digital means.
But perhaps none of these methods of technical destruction comes close to the damage caused by the intentional misinformation these actors spread. The best-known current spreader of such misinformation is the Russian troll farm known as Doppelgänger (and see here, here, and here). Doppelgänger’s agents create fake social media personas and websites that look as official and reliable as the Washington Post or the New York Times, but that hijack viral conversations to promote Russian propaganda. They do this especially well among their prime target audience of right-leaning politicians and their supporters.
Many attempts have been made to slander Ukrainian president Volodymyr Zelenskyy. Russian sources falsely claimed that Zelenskyy purchased two luxury yachts while his country was at war, and that he is a drug addict; they even posted links to substance abuse recovery centers on Zelenskyy’s official social media accounts. When Russian military forces destroyed Ukrainian grain supplies that could have fed over one million people for over a year, Russian media spread misinformation that the Ukrainian grain corridor was a drug trafficking center and a place to transfer weapons.
Beyond social media posts, comments, and attachments, Microsoft Threat Intelligence reports that “Russia-affiliated influence networks have shifted their focus on using video as a more dynamic medium to spread their misinformation.” In March 2024, Russian state television broadcast a fake interview with a Ukrainian official alleging Ukrainian involvement in a terrorist attack. As the quality and believability of fake audio and video continue to improve, people are more likely to accept such content as real.
As the Israel-Hamas conflict has taken attention away from Ukraine, Russian operators have sought geopolitical advantage by pitting the two wars against each other. Near the beginning of the conflict, Doppelgänger “pushed the false claim that Hamas acquired Ukrainian weapons on the black market for its October 7 attacks in Israel.”
Russian propaganda has influenced Republican members of the U.S. Congress, some of whom have repeated it on the House floor. Among other things, these right-wing representatives cited the false allegations about Zelenskyy’s purported yachts as a reason not to send aid to Ukraine. Amid this opposition in Congress, U.S. funding and aid for Ukraine were halted for six months, fueling Russia’s mission to make Ukrainians feel “friendless and alone.”
It may seem as though Russia has successfully dominated Ukraine in the online sphere, but this is not quite the case, as Ukrainian cyber operators work diligently to counter Russian initiatives. However, just as the Russian army continues to launch missile attacks deep into Ukraine while Ukraine is constrained by its allies from returning fire into Russian territory, so too do Russian cyber forces act with much greater impunity.
Microsoft’s Threat Intelligence report shows “that actors linked to Russia’s military and foreign intelligence agencies targeted and breached Ukrainian legal and investigative networks, and those of international organizations involved in war crimes investigations, throughout the spring and summer this year. These cyber operations occurred amid mounting tensions” between Moscow and international institutions like the International Criminal Court (ICC), which earlier this year issued an arrest warrant for Russian president Vladimir Putin on charges of war crimes.
Western governments and tech companies have sought to implement AI-driven defenses against Russian cyber threats, but these efforts have so far proved insufficient: Russian cyber operators continue to troll the internet with few consequences for their actions.
Related reading
Barnes, J. E. (2023, August 25). Russia Pushes Long-Term Influence Operations Aimed at the U.S. and Europe. New York Times. https://www.nytimes.com/2023/08/25/us/politics/russia-intelligence-propaganda.html
Digital Forensic Research Lab. (2024, February 29). Undermining Ukraine: How Russia widened its global information war in 2023. Atlantic Council report. https://www.atlanticcouncil.org/in-depth-research-reports/report/undermining-ukraine-how-russia-widened-its-global-information-war-in-2023/
Fleming, M. (2024, April 25). How can we tackle AI-fueled misinformation and disinformation in public health?. Boston University Center on Emerging Infectious Diseases. https://www.bu.edu/ceid/2024/04/25/how-can-we-tackle-ai-fueled-misinformation-and-disinformation-in-public-health/
Garner, I. (2024, March 9). The West is still oblivious to Russia’s information war. Foreign Policy. https://foreignpolicy.com/2024/03/09/russia-putin-disinformation-propaganda-hybrid-war/
Giles, K. (2023, December 14). Russian cyber and information warfare in practice. Chatham House. https://www.chathamhouse.org/2023/12/russian-cyber-and-information-warfare-practice
Goldin, M. (2023, June 8). Ukraine’s Zelenskyy did not purchase two luxury yachts in October. They’re still up for sale. AP News. https://apnews.com/article/fact-check-zelenskyy-luxury-yachts-75-million-067680385163
Goujard, C. (2024, April 17). Big, bold and unchecked: Russian influence operation thrives on Facebook. Politico. https://www.politico.eu/article/russia-influence-hackers-social-media-facebok-operation-thriving/
Kagubare, I. (2023, December 20). How the US has helped counter destructive Russian cyberattacks amid Ukraine war. The Hill. https://thehill.com/policy/cybersecurity/3769534-how-the-us-has-helped-counter-destructive-russian-cyberattacks-amid-ukraine-war/
McGee-Abe, J. (2023, September 11). One Year on: 10 technologies used in the war in Ukraine. TechInformed. https://techinformed.com/one-year-on-10-technologies-used-in-the-war-in-ukraine/
Micich, A., & Cross, R. J. (2024, March 8). How misinformation on social media has changed news. U.S. PIRG Education Fund. https://pirg.org/edfund/articles/misinformation-on-social-media
Microsoft. (2023, December 11). Russian threat actors dig in, prepare to seize on war fatigue. MySecurity Marketplace. https://mysecuritymarketplace.com/reports/russian-threat-actors-dig-in-prepare-to-seize-on-war-fatigue/
Sandvik, M. (Host). (2024, May 7). Taking aim at Democracy: Russia’s Doppelganger gang isn’t just targeting elections anymore [Audio podcast]. Recorded Future News. https://podcasts.apple.com/us/podcast/click-here/id1225077306
Swenson, A., & Chan, K. (2024, March 21). Election disinformation takes a big leap with AI being used to deceive worldwide. AP News. https://apnews.com/article/artificial-intelligence-elections-disinformation-chatgpt-bc283e7426402f0b4baa7df280a4c3fd#
Townsend, K. (2024, May 23). Russia’s psychological warfare against Ukraine. The Atlantic. https://www.theatlantic.com/podcasts/archive/2024/05/russias-psychological-warfare-against-ukraine/678459/
Watts, C. (2023, December 7). Russian influence and cyber operations adapt for long haul and exploit war fatigue. Microsoft On the Issues. https://blogs.microsoft.com/on-the-issues/2023/12/07/russia-ukraine-digital-threat-celebrity-cameo-mtac/
Wikipedia. Disinformation in the Russian invasion of Ukraine. https://en.wikipedia.org/wiki/Disinformation_in_the_Russian_invasion_of_Ukraine, accessed June 26, 2024.
Wilson Center. (2024, May 7). Influence operations and Russia’s vision of the future. Interview with Catherine Belton. https://www.wilsoncenter.org/audio/influence-operations-and-russias-vision-future